Search Results

Documents authored by Ha Vo, Nguyen



Nguyen, Hai H.

Document
Tight Estimate of the Local Leakage Resilience of the Additive Secret-Sharing Scheme & Its Consequences

Authors: Hemanta K. Maji, Hai H. Nguyen, Anat Paskin-Cherniavsky, Tom Suad, Mingyuan Wang, Xiuyu Ye, and Albert Yu

Published in: LIPIcs, Volume 230, 3rd Conference on Information-Theoretic Cryptography (ITC 2022)


Abstract
Innovative side-channel attacks have repeatedly exposed the secrets of cryptosystems. Benhamouda, Degwekar, Ishai, and Rabin (CRYPTO-2018) introduced local leakage resilience of secret-sharing schemes to study some of these vulnerabilities. In this framework, the objective is to characterize the unintended information revelation about the secret by obtaining independent leakage from each secret share. This work accurately quantifies the vulnerability of the additive secret-sharing scheme to local leakage attacks and its consequences for other secret-sharing schemes.

Consider the additive secret-sharing scheme over a prime field among k parties, where the secret shares are stored in their natural binary representation, requiring λ bits, where λ is the security parameter. We prove that the reconstruction threshold k = ω(log λ) is necessary to protect against local physical-bit probing attacks, improving the previous ω(log λ/log log λ) lower bound. This result is a consequence of accurately determining the distinguishing advantage of the "parity-of-parity" physical-bit local leakage attack proposed by Maji, Nguyen, Paskin-Cherniavsky, Suad, and Wang (EUROCRYPT-2021). Our lower bound is optimal because the additive secret-sharing scheme is perfectly secure against any (k-1)-bit (global) leakage and (statistically) secure against (arbitrary) one-bit local leakage attacks when k = ω(log λ).

Any physical-bit local leakage attack extends to (1) physical-bit local leakage attacks on the Shamir secret-sharing scheme with adversarially-chosen evaluation places, and (2) local leakage attacks on the Massey secret-sharing scheme corresponding to any linear code. In particular, for Shamir’s secret-sharing scheme, the reconstruction threshold k = ω(log λ) is necessary when the number of parties is n = O(λ log λ). Our analysis of the "parity-of-parity" attack’s distinguishing advantage establishes it as the best-known local leakage attack in these scenarios.

Our work employs Fourier-analytic techniques to analyze the "parity-of-parity" attack on the additive secret-sharing scheme. We accurately estimate an exponential sum that captures the vulnerability of this secret-sharing scheme to the parity-of-parity attack, a quantity that is also closely related to the "discrepancy" of the Irwin-Hall probability distribution.
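To make the attack model concrete, here is a minimal Python sketch (an illustration only, not the paper's analysis or code) of additive secret sharing over a prime field and the "parity-of-parity" physical-bit leakage: each party leaks one physical bit of its share, here the least significant bit of its binary representation, and the attacker XORs the leaked bits. The prime, party count, and trial count are arbitrary choices made for this example.

import random

# Illustrative parameters (not taken from the paper).
P = 2**13 - 1          # prime modulus; shares live in {0, ..., P-1}
K = 4                  # number of parties

def additive_shares(secret, k=K, p=P):
    """Split `secret` into k additive shares that sum to `secret` mod p."""
    shares = [random.randrange(p) for _ in range(k - 1)]
    shares.append((secret - sum(shares)) % p)
    return shares

def parity_of_parity(shares):
    """Each party leaks the least significant bit of its share (one physical
    bit of the natural binary representation); the attacker XORs these bits."""
    leak = 0
    for s in shares:
        leak ^= s & 1
    return leak

def empirical_advantage(secret0, secret1, trials=200_000):
    """Estimate how well the one-bit leakage distinguishes two secrets."""
    ones0 = sum(parity_of_parity(additive_shares(secret0)) for _ in range(trials))
    ones1 = sum(parity_of_parity(additive_shares(secret1)) for _ in range(trials))
    return abs(ones0 - ones1) / trials

print(empirical_advantage(0, 1))   # noticeably positive for small k; shrinks as k grows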

Cite as

Hemanta K. Maji, Hai H. Nguyen, Anat Paskin-Cherniavsky, Tom Suad, Mingyuan Wang, Xiuyu Ye, and Albert Yu. Tight Estimate of the Local Leakage Resilience of the Additive Secret-Sharing Scheme & Its Consequences. In 3rd Conference on Information-Theoretic Cryptography (ITC 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 230, pp. 16:1-16:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{maji_et_al:LIPIcs.ITC.2022.16,
  author =	{Maji, Hemanta K. and Nguyen, Hai H. and Paskin-Cherniavsky, Anat and Suad, Tom and Wang, Mingyuan and Ye, Xiuyu and Yu, Albert},
  title =	{{Tight Estimate of the Local Leakage Resilience of the Additive Secret-Sharing Scheme \& Its Consequences}},
  booktitle =	{3rd Conference on Information-Theoretic Cryptography (ITC 2022)},
  pages =	{16:1--16:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-238-9},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{230},
  editor =	{Dachman-Soled, Dana},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITC.2022.16},
  URN =		{urn:nbn:de:0030-drops-164943},
  doi =		{10.4230/LIPIcs.ITC.2022.16},
  annote =	{Keywords: leakage resilience, additive secret-sharing, Shamir’s secret-sharing, physical-bit probing leakage attacks, Fourier analysis}
}
Document
P₄-free Partition and Cover Numbers & Applications

Authors: Alexander R. Block, Simina Brânzei, Hemanta K. Maji, Himanshi Mehta, Tamalika Mukherjee, and Hai H. Nguyen

Published in: LIPIcs, Volume 199, 2nd Conference on Information-Theoretic Cryptography (ITC 2021)


Abstract
P₄-free graphs, also known as cographs, complement-reducible graphs, or hereditary Dacey graphs, have been well studied in graph theory. Motivated by computer science and information theory applications, our work encodes (flat) joint probability distributions and Boolean functions as bipartite graphs and studies bipartite P₄-free graphs. For these applications, the graph properties of edge partitioning and covering a bipartite graph using the minimum number of these graphs are particularly relevant. Previously, such graph properties have appeared in leakage-resilient cryptography and (variants of) coloring problems. Interestingly, our covering problem is closely related to the well-studied problem of product (a.k.a., Prague) dimension of loopless undirected graphs, which allows us to employ algebraic lower-bounding techniques for the product/Prague dimension. We prove that computing these numbers is NP-complete, even for bipartite graphs. We establish a connection to the (unsolved) Zarankiewicz problem to show that there are bipartite graphs with size-N partite sets such that these numbers are at least ε⋅N^{1-2ε}, for ε ∈ {1/3,1/4,1/5,...}. Finally, we accurately estimate these numbers for bipartite graphs encoding well-studied Boolean functions from circuit complexity, such as set intersection, set disjointness, and inequality.

For applications in information theory and communication & cryptographic complexity, we consider a system where a setup samples from a (flat) joint distribution and gives the participants, Alice and Bob, their portion from this joint sample. Alice and Bob’s objective is to non-interactively establish a shared key and extract the left-over entropy from their portion of the samples as independent private randomness. A genie, who observes the joint sample, provides appropriate assistance to help Alice and Bob with their objective. Lower bounds on the minimum size of the genie’s assistance translate into communication and cryptographic lower bounds. We show that (the log₂ of) the P₄-free partition number of a graph encoding the joint distribution that the setup uses is equivalent to the size of the genie’s assistance. Consequently, the joint distributions corresponding to the bipartite graphs constructed above with high P₄-free partition numbers correspond to joint distributions requiring more assistance from the genie.

As a representative application in non-deterministic communication complexity, we study the communication complexity of nondeterministic protocols augmented by access to the equality oracle at the output. We show that (the log₂ of) the P₄-free cover number of the bipartite graph encoding a Boolean function f is equivalent to the minimum size of the nondeterministic input required by the parties (referred to as the communication complexity of f in this model). Consequently, the functions corresponding to the bipartite graphs with high P₄-free cover numbers have high communication complexity. Furthermore, there are functions with communication complexity close to the naïve protocol where the nondeterministic input reveals a party’s input. Finally, the access to the equality oracle reduces the communication complexity of computing set disjointness by a constant factor in contrast to the model where parties do not have access to the equality oracle. To compute the inequality function, we show an exponential reduction in the communication complexity, and this bound is optimal. On the other hand, access to the equality oracle is (nearly) useless for computing set intersection.
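As a concrete illustration of the central graph notion (a brute-force sketch on a toy instance, not the paper's techniques), the following Python code checks whether a small graph contains an induced P₄, i.e., whether it fails to be P₄-free.

from itertools import combinations, permutations

def has_induced_p4(vertices, edges):
    """Return True if some 4 vertices induce a path P4 (three edges, no chord)."""
    adj = {v: set() for v in vertices}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    for quad in combinations(vertices, 4):
        present = {frozenset(p) for p in combinations(quad, 2) if p[1] in adj[p[0]]}
        for a, b, c, d in permutations(quad):
            if present == {frozenset(e) for e in [(a, b), (b, c), (c, d)]}:
                return True
    return False

# Toy bipartite example: left vertices l0, l1 and right vertices r0, r1, r2.
V = ["l0", "l1", "r0", "r1", "r2"]
E = [("l0", "r0"), ("l0", "r1"), ("l1", "r1"), ("l1", "r2")]
print(has_induced_p4(V, E))   # True: r0 - l0 - r1 - l1 is an induced P4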

Cite as

Alexander R. Block, Simina Brânzei, Hemanta K. Maji, Himanshi Mehta, Tamalika Mukherjee, and Hai H. Nguyen. P₄-free Partition and Cover Numbers & Applications. In 2nd Conference on Information-Theoretic Cryptography (ITC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 199, pp. 16:1-16:25, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{block_et_al:LIPIcs.ITC.2021.16,
  author =	{Block, Alexander R. and Br\^{a}nzei, Simina and Maji, Hemanta K. and Mehta, Himanshi and Mukherjee, Tamalika and Nguyen, Hai H.},
  title =	{{P₄-free Partition and Cover Numbers \& Applications}},
  booktitle =	{2nd Conference on Information-Theoretic Cryptography (ITC 2021)},
  pages =	{16:1--16:25},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-197-9},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{199},
  editor =	{Tessaro, Stefano},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ITC.2021.16},
  URN =		{urn:nbn:de:0030-drops-143357},
  doi =		{10.4230/LIPIcs.ITC.2021.16},
  annote =	{Keywords: Secure keys, Secure private randomness, Gray-Wyner system, Cryptographic complexity, Nondeterministic communication complexity, Leakage-resilience, Combinatorial optimization, Product dimension, Zarankiewicz problem, Algebraic lower-bounding techniques, P₄-free partition number, P₄-free cover number}
}

Nguyen, Manh Thang

Document
Termination of Programs using Term Rewriting and SAT Solving

Authors: Jürgen Giesl, Peter Schneider-Kamp, René Thiemann, Stephan Swiderski, Manh Thang Nguyen, Daniel De Schreye, and Alexander Serebrenik

Published in: Dagstuhl Seminar Proceedings, Volume 7401, Deduction and Decision Procedures (2007)


Abstract
There are many powerful techniques for automated termination analysis of term rewrite systems (TRSs). However, up to now they have hardly been used for real programming languages. In this talk, we describe recent results which permit the application of existing techniques from term rewriting in order to prove termination of programs. We discuss two possible approaches: 1. One could translate programs into TRSs and then use existing tools to verify termination of the resulting TRSs. 2. One could adapt TRS-techniques to the respective programming languages in order to analyze programs directly. We present such approaches for the functional language Haskell and the logic language Prolog. Our results have been implemented in the termination provers AProVE and Polytool. In order to handle termination problems resulting from real programs, these provers had to be coupled with modern SAT solvers, since the automation of the TRS-termination techniques had to be improved significantly. Our resulting termination analyzers are currently the most powerful ones for Haskell and Prolog.

Cite as

Jürgen Giesl, Peter Schneider-Kamp, René Thiemann, Stephan Swiderski, Manh Thang Nguyen, Daniel De Schreye, and Alexander Serebrenik. Termination of Programs using Term Rewriting and SAT Solving. In Deduction and Decision Procedures. Dagstuhl Seminar Proceedings, Volume 7401, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2007)


BibTeX

@InProceedings{giesl_et_al:DagSemProc.07401.7,
  author =	{Giesl, J\"{u}rgen and Schneider-Kamp, Peter and Thiemann, Ren\'{e} and Swiderski, Stephan and Nguyen, Manh Thang and De Schreye, Daniel and Serebrenik, Alexander},
  title =	{{Termination of Programs using Term Rewriting and SAT Solving}},
  booktitle =	{Deduction and Decision Procedures},
  series =	{Dagstuhl Seminar Proceedings (DagSemProc)},
  ISSN =	{1862-4405},
  year =	{2007},
  volume =	{7401},
  editor =	{Franz Baader and Byron Cook and J\"{u}rgen Giesl and Robert Nieuwenhuis},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/DagSemProc.07401.7},
  URN =		{urn:nbn:de:0030-drops-12481},
  doi =		{10.4230/DagSemProc.07401.7},
  annote =	{Keywords: Termination, Term Rewriting, Haskell, Prolog, SAT Solving}
}

Nguyen, Lê Thành Dung

Document
Track B: Automata, Logic, Semantics, and Theory of Programming
Implicit Automata in Typed λ-Calculi I: Aperiodicity in a Non-Commutative Logic

Authors: Lê Thành Dũng Nguyễn and Pierre Pradic

Published in: LIPIcs, Volume 168, 47th International Colloquium on Automata, Languages, and Programming (ICALP 2020)


Abstract
We give a characterization of star-free languages in a λ-calculus with support for non-commutative affine types (in the sense of linear logic), via the algebraic characterization of the former using aperiodic monoids. When the type system is made commutative, we show that we get regular languages instead. A key ingredient in our approach – that it shares with higher-order model checking – is the use of Church encodings for inputs and outputs. Our result is, to our knowledge, the first use of non-commutativity in implicit computational complexity.
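The Church-encoding idea can be illustrated outside the paper's typed λ-calculus setting. The Python sketch below (an illustration only; the paper's results concern typed λ-terms, not Python programs) encodes a word over {a, b} as the fold that applies one function per letter, and recognizes the star-free language "no two consecutive a's" by instantiating those functions with the transitions of a counter-free automaton.

def church(word):
    """Church-style encoding of a word over {a, b}: a function expecting one
    endofunction per letter plus a starting value, folding left to right."""
    def encoded(fa, fb, x):
        for letter in word:
            x = fa(x) if letter == "a" else fb(x)
        return x
    return encoded

# Transition functions of a 3-state counter-free DFA for "no 'aa' factor":
# state 0 = last letter was not 'a', 1 = last letter was 'a', 2 = saw 'aa' (dead).
fa = {0: 1, 1: 2, 2: 2}.get
fb = {0: 0, 1: 0, 2: 2}.get

def no_double_a(word):
    """Recognize the star-free language of words without an 'aa' factor by
    running the Church-encoded word on the DFA's transition functions."""
    return church(word)(fa, fb, 0) in {0, 1}

print(no_double_a("ababb"), no_double_a("abaab"))   # True False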

Cite as

Lê Thành Dũng Nguyễn and Pierre Pradic. Implicit Automata in Typed λ-Calculi I: Aperiodicity in a Non-Commutative Logic. In 47th International Colloquium on Automata, Languages, and Programming (ICALP 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 168, pp. 135:1-135:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{nguyen_et_al:LIPIcs.ICALP.2020.135,
  author =	{Nguy\~{ê}n, L\^{e} Th\`{a}nh D\~{u}ng and Pradic, Pierre},
  title =	{{Implicit Automata in Typed \lambda-Calculi I: Aperiodicity in a Non-Commutative Logic}},
  booktitle =	{47th International Colloquium on Automata, Languages, and Programming (ICALP 2020)},
  pages =	{135:1--135:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-138-2},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{168},
  editor =	{Czumaj, Artur and Dawar, Anuj and Merelli, Emanuela},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2020.135},
  URN =		{urn:nbn:de:0030-drops-125426},
  doi =		{10.4230/LIPIcs.ICALP.2020.135},
  annote =	{Keywords: Church encodings, ordered linear types, star-free languages}
}
Document
Track B: Automata, Logic, Semantics, and Theory of Programming
From Normal Functors to Logarithmic Space Queries

Authors: Lê Thành Dũng Nguyễn and Pierre Pradic

Published in: LIPIcs, Volume 132, 46th International Colloquium on Automata, Languages, and Programming (ICALP 2019)


Abstract
We introduce a new approach to implicit complexity in linear logic, inspired by functional database query languages and using recent developments in effective denotational semantics of polymorphism. We give the first sub-polynomial upper bound in a type system with impredicative polymorphism; adding restrictions on quantifiers yields a characterization of logarithmic space, for which extensional completeness is established via descriptive complexity.

Cite as

Lê Thành Dũng Nguyễn and Pierre Pradic. From Normal Functors to Logarithmic Space Queries (Track B: Automata, Logic, Semantics, and Theory of Programming). In 46th International Colloquium on Automata, Languages, and Programming (ICALP 2019). Leibniz International Proceedings in Informatics (LIPIcs), Volume 132, pp. 123:1-123:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2019)


BibTeX

@InProceedings{nguyen_et_al:LIPIcs.ICALP.2019.123,
  author =	{Nguy\~{ê}n, L\^{e} Th\`{a}nh D\~{u}ng and Pradic, Pierre},
  title =	{{From Normal Functors to Logarithmic Space Queries}},
  booktitle =	{46th International Colloquium on Automata, Languages, and Programming (ICALP 2019)},
  pages =	{123:1--123:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-109-2},
  ISSN =	{1868-8969},
  year =	{2019},
  volume =	{132},
  editor =	{Baier, Christel and Chatzigiannakis, Ioannis and Flocchini, Paola and Leonardi, Stefano},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2019.123},
  URN =		{urn:nbn:de:0030-drops-106994},
  doi =		{10.4230/LIPIcs.ICALP.2019.123},
  annote =	{Keywords: coherence spaces, elementary linear logic, semantic evaluation}
}
Document
Unique perfect matchings and proof nets

Authors: Lê Thành Dung Nguyen

Published in: LIPIcs, Volume 108, 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)


Abstract
This paper establishes a bridge between linear logic and mainstream graph theory, building on previous work by Retoré (2003). We show that the problem of correctness for MLL+Mix proof nets is equivalent to the problem of uniqueness of a perfect matching. By applying matching theory, we obtain new results for MLL+Mix proof nets: a linear-time correctness criterion, a quasi-linear sequentialization algorithm, and a characterization of the sub-polynomial complexity of the correctness problem. We also use graph algorithms to compute the dependency relation of Bagnol et al. (2015) and the kingdom ordering of Bellin (1997), and relate them to the notion of blossom which is central to combinatorial maximum matching algorithms.
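For intuition on the matching side (an exponential brute-force illustration; the paper's linear-time criterion is far more refined), the sketch below enumerates the perfect matchings of a small graph, so uniqueness can be read off from the count.

def perfect_matchings(vertices, edges):
    """Enumerate all perfect matchings by matching the smallest unmatched
    vertex with each of its available neighbours in turn."""
    vertices = sorted(vertices)
    adj = {v: set() for v in vertices}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)

    def rec(unmatched):
        if not unmatched:
            yield frozenset()
            return
        v, rest0 = unmatched[0], unmatched[1:]
        for w in adj[v]:
            if w in rest0:
                for m in rec([x for x in rest0 if x != w]):
                    yield m | {frozenset((v, w))}

    return list(rec(vertices))

# The 4-cycle has two perfect matchings, so neither is unique ...
print(len(perfect_matchings([1, 2, 3, 4], [(1, 2), (2, 3), (3, 4), (4, 1)])))   # 2
# ... while the path on 4 vertices has exactly one.
print(len(perfect_matchings([1, 2, 3, 4], [(1, 2), (2, 3), (3, 4)])))           # 1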

Cite as

Lê Thành Dung Nguyen. Unique perfect matchings and proof nets. In 3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018). Leibniz International Proceedings in Informatics (LIPIcs), Volume 108, pp. 25:1-25:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2018)


BibTeX

@InProceedings{nguyen:LIPIcs.FSCD.2018.25,
  author =	{Nguyen, L\^{e} Th\`{a}nh Dung},
  title =	{{Unique perfect matchings and proof nets}},
  booktitle =	{3rd International Conference on Formal Structures for Computation and Deduction (FSCD 2018)},
  pages =	{25:1--25:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-077-4},
  ISSN =	{1868-8969},
  year =	{2018},
  volume =	{108},
  editor =	{Kirchner, H\'{e}l\`{e}ne},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2018.25},
  URN =		{urn:nbn:de:0030-drops-91957},
  doi =		{10.4230/LIPIcs.FSCD.2018.25},
  annote =	{Keywords: correctness criteria, matching algorithms}
}


Nguyen, Trung Thanh

Document
Bi-Criteria Approximation Algorithms for Load Balancing on Unrelated Machines with Costs

Authors: Trung Thanh Nguyen and Jörg Rothe

Published in: LIPIcs, Volume 181, 31st International Symposium on Algorithms and Computation (ISAAC 2020)


Abstract
We study a generalized version of the load balancing problem on unrelated machines with cost constraints: Given a set of m machines (of certain types) and a set of n jobs, each job j processed on machine i requires p_{i,j} time units and incurs a cost c_{i,j}, and the goal is to find a schedule of jobs to machines, which is defined as an ordered partition of n jobs into m disjoint subsets, in such a way that some objective function of the vector of the completion times of the machines is optimized, subject to the constraint that the total cost incurred by the schedule must be within a given budget B. Motivated by recent results from the literature, our focus is on the case when the number of machine types is a fixed constant and we develop a bi-criteria approximation scheme for the studied problem. Our result generalizes several known results for certain special cases, such as the case with identical machines, or the case with a constant number of machines with cost constraints. Building on the elegant technique recently proposed by Jansen and Maack [K. Jansen and M. Maack, 2019], we construct a more general approach that can be used to derive approximation schemes for a wider class of load balancing problems with constraints.
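To make the optimization problem concrete (an illustration only; the paper develops a bi-criteria approximation scheme, not exhaustive search), the sketch below enumerates all assignments of jobs to machines for a toy instance, discards those exceeding the cost budget B, and reports the best achievable makespan. The instance data are made up for the example.

from itertools import product

def min_makespan_under_budget(p, c, budget):
    """p[i][j], c[i][j]: processing time and cost of job j on machine i.
    Exhaustively try every assignment of jobs to machines, keep those whose
    total cost stays within the budget, and return the smallest makespan."""
    m, n = len(p), len(p[0])
    best = None
    for assign in product(range(m), repeat=n):        # assign[j] = machine of job j
        cost = sum(c[assign[j]][j] for j in range(n))
        if cost > budget:
            continue
        loads = [0] * m
        for j, i in enumerate(assign):
            loads[i] += p[i][j]
        makespan = max(loads)
        if best is None or makespan < best:
            best = makespan
    return best

# Toy instance: 2 machines, 3 jobs.
p = [[3, 2, 2],    # processing times on machine 0
     [1, 4, 1]]    # processing times on machine 1
c = [[1, 1, 1],    # costs on machine 0
     [2, 2, 2]]    # costs on machine 1
print(min_makespan_under_budget(p, c, budget=5))   # 2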

Cite as

Trung Thanh Nguyen and Jörg Rothe. Bi-Criteria Approximation Algorithms for Load Balancing on Unrelated Machines with Costs. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 14:1-14:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{nguyen_et_al:LIPIcs.ISAAC.2020.14,
  author =	{Nguyen, Trung Thanh and Rothe, J\"{o}rg},
  title =	{{Bi-Criteria Approximation Algorithms for Load Balancing on Unrelated Machines with Costs}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{14:1--14:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.14},
  URN =		{urn:nbn:de:0030-drops-133582},
  doi =		{10.4230/LIPIcs.ISAAC.2020.14},
  annote =	{Keywords: bi-criteria approximation algorithm, polynomial-time approximation algorithm, load balancing, machine scheduling}
}

Nguyễn, Lê Thành Dũng (Tito)

Document
Syntactically and Semantically Regular Languages of λ-Terms Coincide Through Logical Relations

Authors: Vincent Moreau and Lê Thành Dũng (Tito) Nguyễn

Published in: LIPIcs, Volume 288, 32nd EACSL Annual Conference on Computer Science Logic (CSL 2024)


Abstract
A fundamental theme in automata theory is regular languages of words and trees, and their many equivalent definitions. Salvati has proposed a generalization to regular languages of simply typed λ-terms, defined using denotational semantics in finite sets. We provide here some evidence for its robustness. First, we give an equivalent syntactic characterization that naturally extends the seminal work of Hillebrand and Kanellakis connecting regular languages of words and syntactic λ-definability. Second, we show that any finitary extensional model of the simply typed λ-calculus, when used in Salvati’s definition, recognizes exactly the same class of languages of λ-terms as the category of finite sets does. The proofs of these two results rely on logical relations and can be seen as instances of a more general construction of a categorical nature, inspired by previous categorical accounts of logical relations using the gluing construction.

Cite as

Vincent Moreau and Lê Thành Dũng (Tito) Nguyễn. Syntactically and Semantically Regular Languages of λ-Terms Coincide Through Logical Relations. In 32nd EACSL Annual Conference on Computer Science Logic (CSL 2024). Leibniz International Proceedings in Informatics (LIPIcs), Volume 288, pp. 40:1-40:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2024)


BibTeX

@InProceedings{moreau_et_al:LIPIcs.CSL.2024.40,
  author =	{Moreau, Vincent and Nguy\~{ê}n, L\^{e} Th\`{a}nh D\~{u}ng (Tito)},
  title =	{{Syntactically and Semantically Regular Languages of \lambda-Terms Coincide Through Logical Relations}},
  booktitle =	{32nd EACSL Annual Conference on Computer Science Logic (CSL 2024)},
  pages =	{40:1--40:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-310-2},
  ISSN =	{1868-8969},
  year =	{2024},
  volume =	{288},
  editor =	{Murano, Aniello and Silva, Alexandra},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CSL.2024.40},
  URN =		{urn:nbn:de:0030-drops-196831},
  doi =		{10.4230/LIPIcs.CSL.2024.40},
  annote =	{Keywords: regular languages, simple types, denotational semantics, logical relations}
}
Document
Track B: Automata, Logic, Semantics, and Theory of Programming
Algebraic Recognition of Regular Functions

Authors: Mikołaj Bojańczyk and Lê Thành Dũng (Tito) Nguyễn

Published in: LIPIcs, Volume 261, 50th International Colloquium on Automata, Languages, and Programming (ICALP 2023)


Abstract
We consider regular string-to-string functions, i.e. functions that are recognized by copyless streaming string transducers, or any of their equivalent models, such as deterministic two-way automata. We give yet another characterization, which is very succinct: finiteness-preserving functors from the category of semigroups to itself, together with a certain output function that is a natural transformation.
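As a toy illustration of the machine model being characterized (this is not the paper's categorical construction), the sketch below mimics a copyless streaming string transducer in Python: string-valued registers are updated once per input letter, each register is used at most once per update step, and the output is assembled from the registers at the end. The example computes the regular function w ↦ w·reverse(w).

def palindromize(word):
    """Registers X and Y; on reading letter a, perform the copyless update
    (X, Y) := (X + a, a + Y): each register occurs exactly once on the
    right-hand sides. The output function returns X + Y = w followed by
    reverse(w)."""
    X, Y = "", ""
    for a in word:
        X, Y = X + a, a + Y
    return X + Y

print(palindromize("abc"))   # abccba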

Cite as

Mikołaj Bojańczyk and Lê Thành Dũng (Tito) Nguyễn. Algebraic Recognition of Regular Functions. In 50th International Colloquium on Automata, Languages, and Programming (ICALP 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 261, pp. 117:1-117:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{bojanczyk_et_al:LIPIcs.ICALP.2023.117,
  author =	{Boja\'{n}czyk, Miko{\l}aj and Nguy\~{ê}n, L\^{e} Th\`{a}nh D\~{u}ng (Tito)},
  title =	{{Algebraic Recognition of Regular Functions}},
  booktitle =	{50th International Colloquium on Automata, Languages, and Programming (ICALP 2023)},
  pages =	{117:1--117:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-278-5},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{261},
  editor =	{Etessami, Kousha and Feige, Uriel and Puppis, Gabriele},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2023.117},
  URN =		{urn:nbn:de:0030-drops-181697},
  doi =		{10.4230/LIPIcs.ICALP.2023.117},
  annote =	{Keywords: string transducers, semigroups, category theory}
}
Document
BV and Pomset Logic Are Not the Same

Authors: Lê Thành Dũng (Tito) Nguyễn and Lutz Straßburger

Published in: LIPIcs, Volume 216, 30th EACSL Annual Conference on Computer Science Logic (CSL 2022)


Abstract
BV and pomset logic are two logics that both conservatively extend unit-free multiplicative linear logic by a third binary connective, which (i) is non-commutative, (ii) is self-dual, and (iii) lies between the "par" and the "tensor". It was conjectured early on (more than 20 years ago) that these two logics, which share the same language, both admit cut elimination, and have connectives with essentially the same properties, are in fact the same. In this paper we show that this is not the case. We present a formula that is provable in pomset logic but not in BV.

Cite as

Lê Thành Dũng (Tito) Nguyễn and Lutz Straßburger. BV and Pomset Logic Are Not the Same. In 30th EACSL Annual Conference on Computer Science Logic (CSL 2022). Leibniz International Proceedings in Informatics (LIPIcs), Volume 216, pp. 32:1-32:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2022)


BibTeX

@InProceedings{nguyen_et_al:LIPIcs.CSL.2022.32,
  author =	{Nguy\~{ê}n, L\^{e} Th\`{a}nh D\~{u}ng (Tito) and Stra{\ss}burger, Lutz},
  title =	{{BV and Pomset Logic Are Not the Same}},
  booktitle =	{30th EACSL Annual Conference on Computer Science Logic (CSL 2022)},
  pages =	{32:1--32:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-218-1},
  ISSN =	{1868-8969},
  year =	{2022},
  volume =	{216},
  editor =	{Manea, Florin and Simpson, Alex},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.CSL.2022.32},
  URN =		{urn:nbn:de:0030-drops-157521},
  doi =		{10.4230/LIPIcs.CSL.2022.32},
  annote =	{Keywords: proof nets, deep inference, pomset logic, system BV, cographs, dicographs, series-parallel orders}
}
Document
Track B: Automata, Logic, Semantics, and Theory of Programming
Comparison-Free Polyregular Functions

Authors: Lê Thành Dũng (Tito) Nguyễn, Camille Noûs, and Pierre Pradic

Published in: LIPIcs, Volume 198, 48th International Colloquium on Automata, Languages, and Programming (ICALP 2021)


Abstract
This paper introduces a new automata-theoretic class of string-to-string functions with polynomial growth. Several equivalent definitions are provided: a machine model which is a restricted variant of pebble transducers, and a few inductive definitions that close the class of regular functions under certain operations. Our motivation for studying this class comes from another characterization, which we merely mention here but prove elsewhere, based on a λ-calculus with a linear type system. As their name suggests, these comparison-free polyregular functions form a subclass of polyregular functions; we prove that the inclusion is strict. We also show that they are incomparable with HDT0L transductions, closed under usual function composition (but not under a certain "map" combinator), and satisfy a comparison-free version of the pebble minimization theorem. On the broader topic of polynomial growth transductions, we also consider the recently introduced layered streaming string transducers (SSTs), or equivalently k-marble transducers. We prove that a function can be obtained by composing such transducers together if and only if it is polyregular, and that k-layered SSTs (or k-marble transducers) are closed under "map" and equivalent to a corresponding notion of (k+1)-layered HDT0L systems.
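As a toy illustration of polynomial-growth transductions in the spirit of pebble transducers (this example is standard folklore, is not claimed to be the paper's construction, and its membership in the comparison-free subclass is not asserted), the sketch below outputs, for every input position, one copy of the input with that position marked; the outer loop plays the role of the pebble and the inner loop of the reading head, giving quadratic output size.

def mark_each_position(word, mark="*"):
    """For each position i (the pebble), output a full copy of the input with
    the letter at position i marked (the reading head scans the whole word)."""
    out = []
    for i in range(len(word)):          # pebble position
        for j, a in enumerate(word):    # full scan of the input
            out.append(mark + a if j == i else a)
    return "".join(out)

print(mark_each_position("abc"))   # *abca*bcab*c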

Cite as

Lê Thành Dũng (Tito) Nguyễn, Camille Noûs, and Pierre Pradic. Comparison-Free Polyregular Functions. In 48th International Colloquium on Automata, Languages, and Programming (ICALP 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 198, pp. 139:1-139:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{nguyen_et_al:LIPIcs.ICALP.2021.139,
  author =	{Nguy\~{ê}n, L\^{e} Th\`{a}nh D\~{u}ng (Tito) and No\^{u}s, Camille and Pradic, Pierre},
  title =	{{Comparison-Free Polyregular Functions}},
  booktitle =	{48th International Colloquium on Automata, Languages, and Programming (ICALP 2021)},
  pages =	{139:1--139:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-195-5},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{198},
  editor =	{Bansal, Nikhil and Merelli, Emanuela and Worrell, James},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ICALP.2021.139},
  URN =		{urn:nbn:de:0030-drops-142087},
  doi =		{10.4230/LIPIcs.ICALP.2021.139},
  annote =	{Keywords: pebble transducers, HDT0L systems, polyregular functions}
}